119 research outputs found

    A survey of Montana's public school building needs and financing ability

    An investigation of EPO as a tissue protective agent in human kidney transplantation

    Ischaemia-reperfusion injury (IRI) has been identified as a major contributor to both short- and long-term kidney transplant failure. Experimental evidence from the literature suggests that erythropoietin (EPO) is tissue protective, reducing both inflammation and apoptosis following IRI. We performed a randomised, double-blind, placebo-controlled trial examining the tissue-protective effect of high-dose EPO (100,000 IU over 3 days) in 39 recipients of an extended criteria donor kidney or a non-heart-beating donor kidney. The primary endpoints of the study were differences in plasma and urinary biomarker levels (NGAL, IL-18 and KIM-1) in addition to changes in gene expression. Secondary endpoints included safety, clinical data and differences in metabolomic profiles. No difference was detected between the treatment groups in terms of biomarkers, gene expression, metabolomic profiling or clinical parameters. No adverse events related to EPO therapy were recorded. In addition, we developed a cell model of kidney transplantation using primary tubulo-epithelial cells and HMEC-1 cells with which to confirm the protective effects of EPO. Treatment with 50 U/ml one hour prior to cold hypoxia resulted in the maximum degree of tissue protection, as measured using MTT and LDH assays. No evidence of EPO toxicity was demonstrated. Tubulo-epithelial cells expressed EPOR mRNA and protein; no CD131 receptor could be demonstrated. In summary, EPO confers tissue protection in a cell model of kidney transplantation, but this was not shown to occur in a clinical trial using high-dose EPO in recipients of marginal donor kidneys.
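
    As a rough illustration of how MTT viability and LDH cytotoxicity readouts of the kind used in the cell model are usually summarised, here is a minimal sketch in Python. All absorbance values, well groupings and thresholds are hypothetical and are not taken from the study.

```python
# Illustrative only: percent viability (MTT) and percent cytotoxicity (LDH)
# from plate absorbances. Every reading below is a made-up number.
import numpy as np

def mtt_viability(treated, untreated, blank):
    """Viability of treated wells relative to untreated control, in percent."""
    return 100.0 * (np.mean(treated) - blank) / (np.mean(untreated) - blank)

def ldh_cytotoxicity(sample, spontaneous, maximum):
    """Cytotoxicity relative to a full-lysis (maximum LDH release) control, in percent."""
    return 100.0 * (sample - spontaneous) / (maximum - spontaneous)

blank = 0.05                          # medium-only background absorbance
control = [1.20, 1.18, 1.22]          # normoxic, untreated wells
hypoxia = [0.55, 0.60, 0.58]          # cold hypoxia, no EPO
hypoxia_epo = [0.85, 0.88, 0.83]      # cold hypoxia after 50 U/ml EPO pre-treatment

print(f"viability, hypoxia alone: {mtt_viability(hypoxia, control, blank):.1f}%")
print(f"viability, hypoxia + EPO: {mtt_viability(hypoxia_epo, control, blank):.1f}%")
print(f"LDH cytotoxicity, hypoxia + EPO: {ldh_cytotoxicity(0.40, 0.10, 1.00):.1f}%")
```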

    Improving survival of retinoblastoma in Uganda

    BACKGROUND: Diagnostic delay results in relatively high mortality among children with retinoblastoma in Uganda, where treatment was limited to surgery and, for some, radiotherapy. To improve outcomes, a simple programme of neoadjuvant and adjuvant chemotherapy was introduced. Here we report survival before and after this change to medical practice. METHODS: Affordable standard off-patent chemotherapy agents were administered by trained paramedical staff to groups of patients at the same time, and survival before and after the introduction of chemotherapy was monitored. Between 2006 and 2013 a total of 270 patients with retinoblastoma were included: 181 treated before chemotherapy was introduced and 89 after (beginning in 2009). Follow-up was 94% complete and 249 patients had histological verification of the diagnosis. RESULTS: Using a proportional hazards model adjusted for age, sex and laterality, children treated after chemotherapy was introduced had a 37% lower risk of dying (HR 0.63, 95% CI 0.41 to 0.99) compared with children treated before. Before the introduction of chemotherapy only 15% of children who survived bilateral disease retained vision after treatment, compared with 71% after chemotherapy. CONCLUSIONS: The introduction of chemotherapy proved safe and cost-effective in non-specialist hands and was associated with significant improvements in survival and, among bilateral cases, in preserving vision.
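
    A minimal sketch of the kind of covariate-adjusted survival comparison described above (a Cox proportional hazards model adjusted for age, sex and laterality) is shown below. The data are simulated, the column names are invented, and the lifelines package is assumed; nothing here reproduces the Ugandan cohort or its analysis.

```python
# Illustrative only: Cox model with a treatment-era term plus age, sex and
# laterality covariates, fitted to simulated data.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
post_chemo = rng.integers(0, 2, n)          # treated after chemotherapy was introduced
age_years = rng.uniform(0.5, 5.0, n)
male = rng.integers(0, 2, n)
bilateral = rng.integers(0, 2, n)

# Simulated survival times with a lower hazard in the post-chemotherapy group
event_time = rng.exponential(36, n) * np.where(post_chemo == 1, 1.6, 1.0)
censor_time = rng.exponential(48, n)

df = pd.DataFrame({
    "time_months": np.minimum(event_time, censor_time),
    "died": (event_time <= censor_time).astype(int),
    "post_chemo": post_chemo,
    "age_years": age_years,
    "male": male,
    "bilateral": bilateral,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time_months", event_col="died")
cph.print_summary()   # exp(coef) for post_chemo is the adjusted hazard ratio
```

    In an analysis of this form, exp(coef) for the treatment-era term is the adjusted hazard ratio; a value of 0.63 corresponds to the 37% lower risk of dying reported in the abstract.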

    Distinct and dissociable EEG networks are associated with recovery of cognitive function following anesthesia-induced unconsciousness

    The temporal trajectories and neural mechanisms of recovery of cognitive function after a major perturbation of consciousness are of both clinical and neuroscientific interest. The purpose of the present study was to investigate network-level changes in functional brain connectivity associated with the recovery and return of six cognitive functions after general anesthesia. High-density electroencephalograms (EEG) were recorded from healthy volunteers undergoing a clinically relevant anesthesia protocol (propofol induction and isoflurane maintenance) and from age-matched healthy controls. A battery of cognitive tests (motor praxis, visual object learning test, fractal-2-back, abstract matching, psychomotor vigilance test, digital symbol substitution test) was administered at baseline, upon recovery of consciousness (ROC), and at half-hour intervals up to 3 h following ROC. EEG networks were derived using the strength of functional connectivity measured through the weighted phase lag index (wPLI). A partial least squares (PLS) analysis was conducted to assess changes in these networks: (1) between anesthesia and control groups; (2) during the 3-h recovery from anesthesia; and (3) for each cognitive test during recovery from anesthesia. Networks were maximally perturbed upon ROC but returned to baseline 30-60 min following ROC, despite deficits in cognitive performance that persisted up to 3 h following ROC. Additionally, during recovery from anesthesia, cognitive tests conducted at the same time-point activated distinct and dissociable functional connectivity networks across all frequency bands. The results highlight that the return of cognitive function after anesthetic-induced unconsciousness is task-specific, with unique behavioral and brain network trajectories of recovery.
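
    For readers unfamiliar with the connectivity measure, the sketch below gives one common Hilbert-based estimate of the weighted phase lag index (wPLI) between two channels, the quantity used here to weight the EEG networks. The frequency band, sampling rate and signals are invented, and the code is an illustrative approximation rather than the authors' pipeline.

```python
# Illustrative only: Hilbert-based wPLI between two channels across epochs.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def wpli(x_epochs, y_epochs, fs, band=(8.0, 13.0)):
    """wPLI across epochs for arrays of shape (n_epochs, n_samples)."""
    b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    xa = hilbert(filtfilt(b, a, x_epochs, axis=-1), axis=-1)
    ya = hilbert(filtfilt(b, a, y_epochs, axis=-1), axis=-1)
    imag = np.imag(xa * np.conj(ya))       # imaginary part of the cross-spectrum
    return np.abs(np.mean(imag)) / (np.mean(np.abs(imag)) + 1e-12)

# Two noisy channels with a consistent 10 Hz phase lag -> wPLI close to 1
fs = 250
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(1)
x = np.array([np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
              for _ in range(30)])
y = np.array([np.sin(2 * np.pi * 10 * t - np.pi / 4) + 0.5 * rng.standard_normal(t.size)
              for _ in range(30)])
print(f"alpha-band wPLI ~ {wpli(x, y, fs):.2f}")
```

    wPLI is attractive for scalp EEG because it discounts interactions whose cross-spectrum has a near-zero imaginary part, which are the interactions most easily produced by volume conduction.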

    Preventing kidney transplant failure by screening for antibodies against human leucocyte antigens followed by optimised immunosuppression: OuTSMART RCT

    Design: Investigator-led, prospective, open-labelled, marker-based strategy (hybrid) randomised trial. Background: Allografts in 3% of kidney transplant patients fail annually. Development of antibodies against human leucocyte antigens is a validated predictive biomarker of allograft failure. Under-immunosuppression is recognised to contribute, but whether increasing immunosuppression can prevent allograft failure in human leucocyte antigen Ab+ patients is unclear. Participants: Renal transplant recipients > 1 year post-transplantation attending 13 United Kingdom transplant clinics, without specific exclusion criteria. Interventions: Regular screening for human leucocyte antigen antibodies followed, in positive patients, by interview and tailored optimisation of immunosuppression to tacrolimus, mycophenolate mofetil and prednisolone. Objective: To determine whether optimisation of immunosuppression in human leucocyte antigen Ab+ patients can cost-effectively prevent kidney allograft failure. Outcome: Time to graft failure after 43 months' follow-up in patients receiving the intervention, compared with controls managed by standard of care. Costs and quality-adjusted life-years were used in the cost-effectiveness analysis. Randomisation and blinding: Random allocation (1:1) to unblinded biomarker-led care or double-blinded standard of care, stratified by human leucocyte antigen antibody status (positive/negative) and, in positives, the presence of donor-specific antibodies (human leucocyte antigen antibodies against donor human leucocyte antigens) or not (human leucocyte antigen antibodies against non-donor human leucocyte antigens), baseline immunosuppression and transplant centre. Biomarker-led care human leucocyte antigen Ab+ patients received the intervention; human leucocyte antigen Ab-negative patients were screened every 8 months. Recruitment: Recruitment began in September 2013 and lasted 37 months. The primary endpoint, scheduled for June 2020, was moved to March 2020 because of COVID-19. Numbers randomised: From 5519 screened, 2037 were randomised (1028 to biomarker-led care, 1009 to standard of care), including 198 with human leucocyte antigen antibodies against donor human leucocyte antigens (106 biomarker-led care, 92 standard of care) and 818 with human leucocyte antigen antibodies against non-donor human leucocyte antigens (427 biomarker-led care, 391 standard of care). Numbers analysed: Two patients were randomised in error, so 2035 were included in the intention-to-treat analysis. Results: The trial had 80% power to detect a hazard ratio of 0.49 in the biomarker-led care DSA+ group and > 90% power to detect a hazard ratio of 0.35 in the biomarker-led care non-DSA+ group (with 5% type 1 error). Actual hazard ratios for graft failure in these biomarker-led care groups were 1.54 (95% CI 0.72 to 3.30) and 0.97 (95% CI 0.54 to 1.74), respectively. There was 90% power to demonstrate non-inferiority of the overall biomarker-led care group with an assumed hazard ratio of 1.4; this was not demonstrated, as the upper confidence limit for graft failure exceeded 1.4 (HR 1.02, 95% CI 0.72 to 1.44). The hazard ratio for biopsy-proven rejection in the overall biomarker-led care group was 0.5 (95% CI 0.27 to 0.94; p = 0.03). The screening approach was not cost-effective in terms of cost per quality-adjusted life-year. Harms: No significant differences in other secondary endpoints or adverse events. Limitations: Tailored interventions meant optimisation was not possible in some patients. We did not study pathology on protocol transplant biopsies in DSA+ patients. Conclusions: There was no evidence that optimised immunosuppression in human leucocyte antigen Ab+ patients delays renal transplant failure. Informing patients of their human leucocyte antigen antibody status appears to reduce graft rejection. Future work: We need a better understanding of the pathophysiology of transplant failure to allow rational development of effective therapies. Trial registration: This trial is registered as EudraCT 2012-004308-36 and ISRCTN 46157828. Funding: This project was funded by the National Institute for Health and Care Research (NIHR) Efficacy and Mechanism Evaluation programme (11/100/34) and will be published in full in Efficacy and Mechanism Evaluation, Vol. 10, No. 5. See the NIHR Journals Library website for further project information.
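
    Purely as an illustration of the non-inferiority reasoning in the results above, the short sketch below checks the reported overall hazard ratio and confidence interval (HR 1.02, 95% CI 0.72 to 1.44) against the pre-specified margin of 1.4 and recovers the implied standard error of log(HR). It is a worked reading of the reported numbers, not the trial's analysis code.

```python
# Illustrative only: reported HR and 95% CI versus the 1.4 non-inferiority margin.
import math

hr, ci_low, ci_high = 1.02, 0.72, 1.44
margin = 1.4

# The 95% CI on the log scale is log(HR) +/- 1.96 * SE, so the implied SE can be
# recovered from the reported interval.
se_log_hr = (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)
print(f"implied SE of log(HR): {se_log_hr:.3f}")

# Non-inferiority requires the entire 95% CI to lie below the margin.
print(f"upper limit {ci_high} < margin {margin}? {ci_high < margin}")
```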

    Co-ordination of early and late ripening events in apples is regulated through differential sensitivities to ethylene

    In this study, it is shown that anti-sense suppression of Malus domestica 1-AMINOCYCLOPROPANE-1-CARBOXYLATE OXIDASE (MdACO1) resulted in fruit with ethylene production sufficiently low to allow ripening to be assessed in the absence of ethylene. Exposure of these fruit to different concentrations of exogenous ethylene showed that flesh softening, volatile biosynthesis and starch degradation had differing ethylene sensitivity and dependency. Early ripening events such as the conversion of starch to sugars showed a low dependency on ethylene but a high sensitivity to low concentrations of ethylene (0.01 μl l⁻¹). By contrast, later ripening events such as flesh softening and ester volatile production showed a high dependency on ethylene but were less sensitive to low concentrations (needing 0.1 μl l⁻¹ for a response). A sustained exposure to ethylene was required to maintain ripening, indicating that the role of ethylene may go beyond that of ripening initiation. These results suggest a conceptual model for the control of individual ripening characters in apple, based on both ethylene dependency and sensitivity.
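
    As a loose illustration of the sensitivity/dependency distinction drawn above, the sketch below fits a Hill dose-response curve to hypothetical ripening data: the fitted EC50 stands in for sensitivity and the fitted response span for dependency. The concentrations and responses are invented and scipy is assumed; this is not the paper's analysis.

```python
# Illustrative only: Hill dose-response fits to invented ripening data.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, baseline, span, ec50, n):
    """Four-parameter Hill curve: baseline + span * c^n / (EC50^n + c^n)."""
    return baseline + span * conc**n / (ec50**n + conc**n)

conc = np.array([0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])   # ul/l ethylene
starch_degradation = np.array([0.30, 0.45, 0.70, 0.85, 0.92, 0.95, 0.96, 0.97, 0.97])
flesh_softening = np.array([0.05, 0.06, 0.08, 0.15, 0.50, 0.75, 0.88, 0.93, 0.95])

for name, y in [("starch degradation", starch_degradation),
                ("flesh softening", flesh_softening)]:
    popt, _ = curve_fit(hill, conc, y, p0=[0.1, 0.9, 0.05, 1.0],
                        bounds=(0, np.inf), maxfev=10000)
    baseline, span, ec50, n = popt
    print(f"{name}: EC50 ~ {ec50:.3f} ul/l (sensitivity), span ~ {span:.2f} (dependency)")
```

    In this framing, a trait such as starch degradation would show a small EC50 (high sensitivity) and a large response even at low ethylene (low dependency), whereas flesh softening would show a larger EC50 and a response that only develops once ethylene is supplied.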